An Improved Gauss-Newtons Method based Back-propagation Algorithm for Fast Convergence

Similar Articles

An Improved Gauss-Newtons Method based Back-propagation Algorithm for Fast Convergence

The present work deals with an improved back-propagation algorithm based on the Gauss-Newton numerical optimization method for fast convergence. Standard back-propagation uses the steepest descent method. The algorithm is tested on various datasets and compared with the steepest-descent back-propagation algorithm. In the system, optimization is carried out on a multilayer neural network. The ...
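
As a rough illustration of the general idea rather than the authors' implementation, the sketch below applies a damped Gauss-Newton weight update to a least-squares loss on a toy one-layer tanh model; the model, the parameter names, and the small damping term are assumptions added for the example.

import numpy as np

def residuals(w, X, y):
    # Residual vector r = model(X; w) - y for a toy one-layer tanh model.
    return np.tanh(X @ w) - y

def jacobian(w, X):
    # Analytic Jacobian of the residuals w.r.t. the weights:
    # d tanh(X w) / dw = diag(1 - tanh(X w)^2) @ X
    s = 1.0 - np.tanh(X @ w) ** 2
    return s[:, None] * X

def gauss_newton_step(w, X, y, damping=1e-6):
    # Solve (J^T J + damping * I) dw = -J^T r and apply the step.
    r = residuals(w, X, y)
    J = jacobian(w, X)
    A = J.T @ J + damping * np.eye(w.size)
    dw = np.linalg.solve(A, -J.T @ r)
    return w + dw

# Toy usage: fit noisy data by iterating a few Gauss-Newton steps.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([0.5, -1.0, 2.0])
y = np.tanh(X @ w_true) + 0.01 * rng.normal(size=100)

w = np.zeros(3)
for _ in range(10):
    w = gauss_newton_step(w, X, y)
print(w)  # should approach w_true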


An Improved Learning Algorithm based on the Conjugate Gradient Method for Back Propagation Neural Networks

The conjugate gradient optimization algorithm, usually used for nonlinear least squares, is presented and combined with the modified back-propagation algorithm, yielding a new fast training algorithm for multilayer perceptrons (MLPs), CGFR/AG. The approach presented in the paper consists of three steps: (1) modification of the standard back-propagation algorithm by introducing a gain variation term of th...
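
For orientation only, the sketch below shows the Fletcher-Reeves conjugate-gradient update that this family of methods (CGFR) builds on; the fixed step size stands in for a proper line search, and the paper's gain-variation modification is not reproduced.

import numpy as np

def cg_train(grad_fn, w, steps=200, lr=0.05):
    # grad_fn(w) returns the gradient of the training error at w.
    g = grad_fn(w)
    d = -g                      # initial search direction: steepest descent
    for _ in range(steps):
        w = w + lr * d          # fixed step instead of an exact line search
        g_new = grad_fn(w)
        beta = (g_new @ g_new) / (g @ g + 1e-12)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d   # conjugate search direction
        g = g_new
    return w

# Toy usage: minimise a quadratic error surface 0.5 * w^T A w - b^T w.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
w_opt = cg_train(lambda w: A @ w - b, np.zeros(2))
print(w_opt, np.linalg.solve(A, b))  # the two should be close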


An improved opposition-based Crow Search Algorithm for Data Clustering

Data clustering is a practical way of working with a huge amount of data and looking for structure in a dataset. In other words, clustering groups similar data: the similarity among the data within a cluster is maximal, while the similarity among the data in different clusters is minimal. The contribution of this paper is a clustering method based on the Crow Search Algorithm (CS...
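
As a hedged sketch of how Crow Search positions can encode cluster centroids (standard CSA, not necessarily this paper's improved opposition-based variant), the example below minimises the sum of squared distances to the nearest centroid; parameter names such as flight_length and awareness_prob are the usual CSA terms, chosen here for illustration.

import numpy as np

def sse(centroids, data):
    # Clustering fitness: sum of squared distances to the nearest centroid.
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return np.sum(d.min(axis=1) ** 2)

def csa_cluster(data, k, n_crows=20, iters=100,
                flight_length=2.0, awareness_prob=0.1, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = data.min(axis=0), data.max(axis=0)
    dim = data.shape[1]
    # Each crow's position is a set of k candidate centroids.
    pos = rng.uniform(lo, hi, size=(n_crows, k, dim))
    mem = pos.copy()                                  # best position per crow
    mem_fit = np.array([sse(p, data) for p in mem])
    for _ in range(iters):
        for i in range(n_crows):
            j = rng.integers(n_crows)                 # crow i follows crow j
            if rng.random() > awareness_prob:
                # Move toward crow j's memorised food source (centroids).
                new = pos[i] + rng.random() * flight_length * (mem[j] - pos[i])
                new = np.clip(new, lo, hi)
            else:
                # Crow j is aware of being followed: jump to a random position.
                new = rng.uniform(lo, hi, size=(k, dim))
            pos[i] = new
            fit = sse(new, data)
            if fit < mem_fit[i]:                      # update memory if better
                mem[i], mem_fit[i] = new, fit
    return mem[np.argmin(mem_fit)]                    # best centroids found

# Toy usage on two well-separated blobs.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
print(csa_cluster(data, k=2))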


An Improved Conjugate Gradient Based Learning Algorithm for Back Propagation Neural Networks

The conjugate gradient optimization algorithm is combined with the modified back-propagation algorithm to yield a computationally efficient algorithm for training multilayer perceptron (MLP) networks, CGFR/AG. Computational efficiency is enhanced by adaptively modifying the initial search direction, as described in the following steps: (1) modification of the standard back-propagation algorithm by ...


An Adaptive Training Method of Back-Propagation Algorithm

Back-propagation is currently the most widely applied neural network training algorithm. However, its slow learning speed and local-minima problem are often cited as its major weaknesses. This paper describes an adaptive training algorithm based on selective retraining of patterns through error analysis, together with dynamic adaptation of the learning rate and momentum through ...
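
As one common heuristic in the spirit of dynamic learning-rate and momentum adaptation (not the paper's specific rule, and without the selective-retraining step), the sketch below grows the step size while the epoch error keeps falling and shrinks it, resetting momentum, when the error rises.

def adapt_hyperparameters(prev_error, curr_error, lr, momentum,
                          grow=1.05, shrink=0.7, max_momentum=0.9):
    if curr_error < prev_error:
        # Error fell: cautiously increase the step size and momentum.
        lr = lr * grow
        momentum = min(momentum + 0.05, max_momentum)
    else:
        # Error rose: shrink the step size and drop momentum to stabilise.
        lr = lr * shrink
        momentum = 0.0
    return lr, momentum

# Example: the schedule reacts to the epoch-wise training error.
lr, momentum, prev = 0.1, 0.5, float("inf")
for epoch_error in [1.0, 0.8, 0.9, 0.6, 0.55]:
    lr, momentum = adapt_hyperparameters(prev, epoch_error, lr, momentum)
    prev = epoch_error
    print(f"error={epoch_error:.2f}  lr={lr:.4f}  momentum={momentum:.2f}")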


Journal

Journal title: International Journal of Computer Applications

Year: 2012

ISSN: 0975-8887

DOI: 10.5120/4837-7097